10:00 am
-- the state of women's rights worldwide and who will pick up the mantle on the issue with Hillary Clinton departing her post as Secretary of State. I want to thank everybody who participated in this edition of "Washington Journal." We look forward to seeing you all again tomorrow morning, Sunday, at 7:00 a.m. Eastern time. >> Now, a discussion on facial recognition technology and privacy issues. After that, Oregon Senator Ron Wyden talks about global issues with the internet. And then, South Carolina Governor
10:01 am
Nikki Haley. Now, a discussion on facial recognition technology and the privacy issues that arise as it becomes more widespread. This is about one hour and 20 minutes. >> I am a technology reporter for Politico. I have a great panel, so I will not bore you with a long introduction. To start us off, we have the FTC commissioner, who was sworn in to a term that expires in 2018. She focuses on FTC issues, including privacy.
10:02 am
She will get us started with a recap of what the FTC is working on. >> I am delighted for the opportunity to provide some introductory thoughts on the topic of this panel, facial recognition technology. I will be speaking from the perspective of consumers. The FTC's mission is to prevent business practices that are anti-competitive, deceptive, or unfair to consumers, without unduly burdening legitimate business activity.
10:03 am
First, I would like to note that my comments are my own and do not necessarily represent the views of the other FTC commissioners. I cannot miss the opportunity to quote the "Minority Report" line: "John Anderton, you could use a Guinness right now." Companies are already deploying facial recognition technology in a wide variety of contexts. Some are more sophisticated than others. There has been a lot of coverage of the Panasonic television that pops up
10:04 am
content tailored to different members of the household. One of the things I like to do is to be careful that we are not lumping different kinds of facial recognition technology into one bucket, but understanding them individually. This technology can be used for facial detection: here is a face; locate the face in a photograph. We are locating faces in order to blur them, or to ensure that chat feeds include a face and not something else that could be
10:05 am
disturbing. Companies can place cameras into digital signs to determine demographic characteristics such as age range and deliver targeted advertising based on that consumer's demographic profile. It might show a 30-year-old man an advertisement for shaving cream, while a woman might be shown an advertisement for perfume. One company has leveraged this ability to determine age, race,
10:06 am
and gender to gather aggregate demographic data. Cameras are placed at the entrances of these venues. These are not images of individual customers; rather, they are used by vendors and third parties, such as liquor distributors, to understand the demographics of a particular venue's customers and tailor their specials to that particular demographic. Consumers can make decisions about which vendor to patronize.
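[The demographic signage logic described here boils down to a lookup from an estimated profile to an ad. A minimal sketch for illustration only; the category labels and ad names are invented, and real systems feed classifier outputs into much richer targeting rules.]

```python
# Hypothetical sketch of demographic-targeted signage: a camera estimates
# gender and an age range, and the sign selects an ad accordingly.
# All categories and ad names here are invented for illustration.

AD_INVENTORY = {
    ("male", "25-40"): "shaving cream",
    ("female", "25-40"): "perfume",
}

def choose_ad(gender, age_range, default="house promotion"):
    """Return the ad slotted for the estimated demographic, or a default."""
    return AD_INVENTORY.get((gender, age_range), default)
```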
10:07 am
An even more sophisticated application is currently in use. Companies use technology to compare an individual's facial characteristics across different images to identify him or her. An image of an individual is matched with other images of the same individual. If the face of one individual can be identified by name, a previously anonymous face can be identified in any other photograph. That raises more privacy implications. Why is this happening now? Technology is advancing. Why is this coming to the forefront now? Until recently, facial recognition technology was not widely used on a
10:08 am
commercial basis. We have had an improvement in higher-quality digital cameras, and biometric data can be more easily extracted. If photographs were taken from different angles, it used to be difficult to match them. There are more identifiable
10:09 am
images of private individuals online. Approximately 2.5 billion photos are uploaded to Facebook each month. It is an amazing data set. I am sure you are all familiar with the ability to tag people in photographs. That came to the forefront of my mind when the President nominated me to be an FTC commissioner. I thought, what photographs are out there of me? What do I look like in those photographs?
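[The matching step described here can be sketched abstractly: modern systems reduce each face image to a numeric feature vector and treat two faces as the same person when the vectors are close. A toy illustration; the vectors and the 0.6 threshold are invented, and real systems learn high-dimensional embeddings.]

```python
import math

def euclidean(a, b):
    """Distance between two face feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def same_person(emb_a, emb_b, threshold=0.6):
    """Declare a match when the feature vectors are closer than the threshold."""
    return euclidean(emb_a, emb_b) < threshold
```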
10:10 am
Facial recognition technology uses data about the size of your face and your eyes. At the FTC, we like to pay attention to what new issues might be coming down the road for consumers so we might have an understanding of what some of the benefits and risks might be. We have an active policy-making function at the FTC. We held a workshop in December of 2011 on facial recognition technology, and we issued a report called the "Facing Facts" report. It recommended best practices for companies that use facial recognition technology. We had the workshop, and we received a number of comments,
10:11 am
and we issued this report setting out best practices. The report is not intended to serve as a template for law enforcement or as a statement of current law, but as a way to explore current issues. A lot of this was based on our March 2012 report, "Protecting Consumer Privacy in an Era of Rapid Change." We called upon companies to implement privacy by design, to provide consumers with simplified choices, and to increase transparency by providing clearer, shorter, and more standardized privacy notices. I wanted to give you a
10:12 am
preview of what some of the technology issues are and some of the thoughts the FTC is bringing to bear in this area. >> Later, we will take audience questions during the rest of this panel. We have a great group. It is one of those days; some of us did have a Guinness last night. We also have with us counsel to Congresswoman Lofgren. I want to start out talking about the FTC report.
10:13 am
There are a few lines in the report that speak directly to what consumers do and do not know. Consumers likely do not expect signs to detect their age and gender and target ads to them at a certain time. They may not expect that some of those photographs are stored by social networks. With that in mind, give us your perspective on facial recognition privacy. What do you see as the issues at play? >> We got a faceprint of all of you coming in, and we will need a DNA sample on the way out. We always start out these discussions with the quote from the "Minority Report" movie. The
10:14 am
problem was not that the technology could identify the individual, but that it came in through the door. There are Fourth Amendment concerns that matter with respect to biometrics. Face cameras remain a big debate. The private sector is capable of making new security guarantees and privacy guarantees, but we have the unfortunate circumstance of being in a homeland security culture. We will talk about this as we go along. There was a Wall Street Journal piece about the federal government. A national ID card comes up
10:15 am
occasionally in these debates. Potentially compulsory are the cameras out in public, where biometric information collection is taking place. A targeted individual may already be in the database, and the camera may be looking for a match. You asked about expectations. Consumers do not expect that they are being put in databases and tracked wherever they go. Those questions go to the Fourth Amendment, which protects us in our homes; it should also protect us in other places. The private sector can make those security guarantees, but in a homeland security culture, the horse is already out of the barn. If companies want to keep their databases private, would they be
10:16 am
required to share them with the federal government? Here is the new national counterterrorism information center. I have a lot of faith in the potential of technology to do good things. When we blend the compulsory with the voluntary, we run into problems. >> There is a tendency to fall prey to moral panic. People forget that the issues are not entirely new. Facial recognition is photography plus math. That is not the end of the story. What we are really talking about is identification plus
10:17 am
abuse. There are two steps, and the first steps are free speech. I believe in an open and transparent society. I think it is a good thing that the people who wanted to live in an anonymous world before photography lost that fight. It is part of our understanding of the facts around us. It is about how we understand the world. The opportunity for us to protect consumers is in the step that comes after that: the uses of information. We have been talking about the uses by government. It is not just a national security world.
10:18 am
It is also law enforcement. National security uses can harm individual users, and it does not end there. Government should be required to get a warrant before it correlates two different databases. This happened in Britain when they started to correlate the database of driver's license photos with images taken from closed-circuit televisions. That is where the Fourth Amendment comes into play. There are standards of evidence. If you were in one of those riots in Britain, you might have had your cell phone photos produced in court because you looked like somebody who was throwing a brick through a window. Maybe Congress has a role. I am concerned about the other set of uses that were alluded
10:19 am
to: uses by the private sector. Consumers are uncomfortable with this technology. They do not expect that it is being used. That is different from saying we have to do something about this technology because it can harm people. The Federal Trade Commission rightly protects consumers against deception and unfair practices. If we stick to that basic framework, we can deal with harmful uses of facial recognition without trying to restrict the underlying photography and math, which I think should happen freely. We should protect consumers from improper uses of their information, for example for credit purposes.
10:20 am
We have a rich body of law. We also have property law. We have ways of dealing with problems like intrusion, making sure companies do not place cameras in sensitive places. I would refer all of you to the professor at Arizona State law school who has done good work on free speech and privacy, and also more recently on the tort of intrusion. She offers arguments about why having a camera doing facial recognition in a bathroom is a problem. We can develop a new and exciting technology and not fall
10:21 am
prey to the moral panic that comes up around privacy. >> Talk about consumer expectations, going back to the report on what consumers thought was being collected. >> I want to talk about the 70% of the facial recognition market where people are actually affirmatively identified and then matched. 70% of that is government and law enforcement. These are worldwide systems. The amount of investment is approximately $6 billion -- you may be able to adjust that number a little bit -- and that is in U.S. dollars in 2012. In 2014, the number is expected
10:22 am
to rise to $9 billion. The technology is moving into private retail for shoplifting prevention purposes. This is an important starting place. The first rule is that you must tailor the biometric to the place you are working in. How it is being deployed changes how you can handle that technology. I will give you a couple of examples. One example would be in Kenya,
10:23 am
where biometric kits were sent to advance a national identity card. It was semi-voluntary. In the U.K., because of the Protection of Freedoms Act, they had their first surveillance commissioner, who came out this last fall and said: I am a former police officer, and absolutely no one expects this technology to be working the way it is. When we walk into Walgreens
10:24 am
or CVS -- if you look at other nations in Asia, the use gets higher, especially in China. We need to be careful of government use. Technology developed by the government moves to the open market. This happens in the U.S. in an orderly way, with a lot of rights. In other parts of the world, it is not moving in such a nice
10:25 am
way. This is one of our great concerns with facial recognition technology. Because of the way the internet works, the free-form commercial uses find their way back into the U.S. through technology that is simply worldwide at this point. That is a concern. The other thing is consumer expectation of privacy. If we base things on what consumers expect, we will have one million consumer privacy laws. In the area of facial recognition, consumers cannot expect what is happening right now. I do not know that it is possible to educate the millions and millions of consumers who would need to be
10:26 am
educated to actually have this be meaningful. We are going to have to look hard at collection and use and all of the different things we talk about regarding fair information practices in order to see where the middle ground is here. There is a whole lot of bell curve in this area of policy making; on one side, people have a diffuse, inarticulate fear of the technology. We are not going to be able to stop the deployment of this technology. The question is, what is the next step? >> You worked pretty extensively
10:27 am
on facial recognition best practices, something the organization has been talking about for a while now. Talk about that as you sketch out your perspective. >> I do not want to represent the Hill perspective. I need to stress that anything I say on this panel should be considered my personal opinion and thoughts, not those of the congresswoman for whom I work. The current technology is quite sophisticated. It is entirely possible to set up, using low-cost technology, a network that can identify large numbers of people and track them as they go from place to place. You can search for information including their names and shopper profiles. The same facial recognition systems, using existing technologies, can pick up things not
10:28 am
associated with your facial geometry. There are some more cutting-edge ones, already operational, that can pick up on your emotional state -- whether you are anxious or happy. Some can pick up on your health: they can measure the blood flow in your face to estimate your blood pressure or your blood oxygen levels. The privacy implications are massive, and consumers will want to be notified and have a choice as to whether or not they are being included in these systems. My impression is that many of the businesses currently using technologies like this are not giving them this choice. It is challenging by the very nature of the technology.
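[The health-sensing capability described here corresponds to remote photoplethysmography: tiny periodic color changes in facial skin track the pulse. A toy sketch on a synthetic signal, not any deployed product; the 72 bpm pulse, 30 fps frame rate, and brute-force frequency sweep are illustrative assumptions.]

```python
import math

FPS = 30   # assumed camera frames per second
BPM = 72   # simulated pulse rate
N = 300    # ten seconds of frames

# Simulated mean green-channel brightness of the face region, frame by frame:
# a steady baseline plus a faint oscillation at the pulse frequency.
signal = [100 + 0.5 * math.sin(2 * math.pi * (BPM / 60) * (i / FPS))
          for i in range(N)]

def estimate_bpm(samples, fps):
    """Pick the dominant frequency by scoring candidate pulse rates."""
    mean = sum(samples) / len(samples)
    centered = [s - mean for s in samples]
    best_bpm, best_power = 0, -1.0
    for bpm in range(40, 180):
        f = bpm / 60  # candidate frequency in Hz
        re = sum(c * math.cos(2 * math.pi * f * i / fps)
                 for i, c in enumerate(centered))
        im = sum(c * math.sin(2 * math.pi * f * i / fps)
                 for i, c in enumerate(centered))
        power = re * re + im * im
        if power > best_power:
            best_bpm, best_power = bpm, power
    return best_bpm
```

On this clean synthetic signal the sweep recovers the simulated rate; real video adds motion and lighting noise that a practical pipeline must filter out.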
10:29 am
Any facial recognition discussion that fails to include this is incomplete and is doomed to irrelevancy in a matter of years. Innovation will need to be harmonized with our important rights. That is one of the challenging privacy issues today. We worked independently on sets of self-regulatory guidelines for businesses, and they are quite consistent. Our best recommendation was that if you would identify an individual in more than one place, you let consumers know it is happening. It is not the best solution, but it is the only solution that
10:30 am
is out there right now: it is privacy by walking away. There are offline retail settings where there is not that much data out there. We urge businesses to respect the privacy of individuals.
10:31 am
I have seen no literature that suggests a good way to do that. >> I want to make sure we have time. Give us your perspective and a little bit about what you do. >> We are one of the leaders in facial recognition. The technology has really improved over the last 15 years. Our customers are primarily government, though we do have commercial customers. It is largely forensic; these days, that is where all of the investment is going. Identify the bank robber from the photograph. In the forensic context, just as with the automated fingerprint systems that we see on TV, the way it actually works is that you often get a selection
10:32 am
of possibilities, and then there is a human element to arrive at a decision. Humans are better at recognizing these patterns. There is a process in the law enforcement area that I am comfortable with. I would like to add another technology to this panel: iris. Iris and face go together. Cameras are getting better and better. You can film movies on an SLR.
10:33 am
There may be a message here in that the company sent the lawyer. We are keen on having rules because it is in our interest not to have this technology become anathema to our customers. We want to continue to profit. Rules in place can work. One example is something we are all aware of: wiretapping. There is the Fourth Amendment, and there is the process at the DOJ, where I once worked, that restricts that activity. It is also restricted for private citizens. If you do not follow the rules, you can find yourself in a situation like News Corp. Anyone can learn how to wiretap
10:34 am
someone else's phone with a short tutorial. You have to have rules to govern conduct. >> Does the FTC have the legal tools it needs in this space, and how does it use those tools? >> The basic tools we have allow us to reach deception and unfairness. Deception is clear: if you make a promise -- we are going to take this data of yours and use it for these
10:35 am
purposes -- and then do otherwise, that is a pretty clear deception. We have brought a lot of cases for failure to secure data sufficiently and take reasonable precautions. If a company collects a whole database of people's faces and does not secure it well, and it gets accessed and abused, we can say: there is consumer harm; you did not meet a promise. We define carefully what types of harms are unfair. They have to be substantial; they cannot be speculative. They particularly involve financial harm, health, safety, medical information about
10:36 am
children, and intrusions into the home. When you start to think about a lot of these issues, that provides a pretty good framework. We brought a case last fall against a company called DesignerWare. They worked with rent-to-own stores whose computers had a program that could turn on the camera, take pictures of what was happening around them, and send it back to the rent-to-own store. We challenged that under deception and unfairness. Some of what was in the screenshots was financial data, and some of it was people's private activity in their bedrooms being captured and sent back. We thought that clearly met our standard of what is unfair. I like to be sure we are using
10:37 am
it the way we need it. It gives us a lot of flexibility to address some of these issues. >> That is a question I want to pose to you. Is Section 5 authority enough, or are there holes? Does there need to be something that takes it a step further, to get a law like we see in states like Illinois that spells out what you can and cannot do with facial recognition technology? >> I have no doubt that the commissioner thinks carefully about these things and is prudent about how far it can properly be stretched. I cannot say the same for the majority on the commission. The FTC, before she got there, had clearly been changing its fundamental role. Its chairman and the folks running the Bureau of Consumer Protection think of the
10:38 am
commission as the nation's largest public interest law firm. They have been pushing the boundaries of the law in two ways: stretching the boundaries of unfairness, and putting out a series of reports and otherwise using what might be called agency threats. They have continued to pressure companies to do things that are nominally voluntary. I am deeply concerned about where the agency is going with its authority. Unfairness is the right standard for dealing with these basic problems, but it will be stretched too far, and the agency is making non-law law in the process of putting out these reports, issuing these consent decrees, and settling cases that never go before the courts. I will close by saying we finally have pending litigation
10:39 am
that may actually resolve some of these questions. The commission has brought a series of data security cases, and that goes to the heart of facial recognition enforcement. Wyndham Hotels has challenged the commission, arguing that it has failed to establish that those practices are unfair. If it comes out the way it might, that could shake the foundations of FTC law in this area. I am not against legislation. It may be that we do not want to stretch Section 5 too far. >> I want to change the question a little bit. We have been talking about what happens when companies screw up entirely.
10:40 am
What happens when companies are collecting the information, or when they store that information? Is the notice-and-consent regime sufficient, or is that where Congress needs to get involved? >> With facial recognition technology, where you are actually capturing a person, identifying them, and using that for some purpose, where the law starts to come in is with notice. Notice is inadequate at this point in this country. Discrimination law is a better tool.
10:41 am
The Equal Credit Opportunity Act is for this sort of thing. If someone is being offered a higher price or less favorable terms on the basis of facial recognition in a commercial setting, that is a problem. There are also law enforcement and government issues. One complicating factor here is the newly released regulations: COPPA now includes photographs of children. We do not know how children are being captured and identified in this country. In Japan, it is already being done.
10:42 am
I do not think it is being done here yet. If that ever happens, there will be a real problem. It would be clear: a kid walks into a store, it is detected that this is a kid, and the advertising and marketing change because there is a child under the age of 13 there. This is a challenging proposition on many levels. I think it will become a real issue. Where are the real harms here? The real harms have to be mapped. We need to find out what is really improper use of this data and what the real risk is, so we do not put our attention on places where the real harm is not. We need a lot more work to
10:43 am
identify the substance of harms. >> To your original question: I do not think the FTC Act is sufficient. I would like to see baseline consumer privacy legislation. This is not just about the harms that facial recognition can cause, nor is it that the spiders are coming in under your door. I tried to sketch out the capability of the technology. When we can identify people and pick up on their emotional states and their health information, consumers will want to have a choice on whether they participate in this. Their participation may not result in a harm, but the privacy impact is massive. This can completely change the
10:44 am
dynamic of privacy in public. Unless there are rules that look beyond the home, consumers will have no choice. The system can be out there, customers will be in it, and they cannot get out. The dynamic of privacy in public will be completely different. There is a role for government in setting some limits and articulating some of the boundaries for the use of technology like that. >> Just briefly, I appreciate where he is coming from, but keep in mind that privacy is a competitive feature. That is one of the reasons why, as you mentioned, you do not see a lot of abuses yet. It is not just self-regulation; nobody has that luxury. If you are operating in the technology space, you are not operating alone. You have upstream customers and downstream customers. There are a lot of pressures
10:45 am
on any company in the economy. I am not at the point where I think marketing counts as a harm, except with respect to children. Do not look at the chicken when there is a tyrannosaurus rex coming up. I am concerned about government collection. How do we wall it off? We have to build up those claims. >> I want to talk about self-regulation. We have about five minutes. >> The answer is targeted legislation focusing on usage. Conceptually, we are talking about competition. What if stores and places you went in
10:46 am
the business world simply used structured data? That is the kind of technological change that would enable the kind of competition on privacy we need to talk about: focusing on usage and focusing on improving transparency. >> Let's talk about self-regulation. Facial recognition with respect to digital signage is one of those areas. Do we think self-regulatory efforts have worked, or have they fallen by the wayside? We have the little code of conduct. We have the little blue icon. When we are talking about facial recognition and digital signs, is that form of notice effective? What needs to be done to strengthen that self-regulatory regime?
10:47 am
>> We mentioned several self-regulatory programs. If a company says we adhere to these guidelines and they do not, that is a cut-and-dried case. The FTC can provide backup in that area. As to whether the companies are complying and whether there is a big problem, a lot of this is in the early stages. >> I am really proud of the work we did on our self-regulatory regime. We were just talking about it; we were on the same panel when, in 2010, we were in Las
10:48 am
Vegas announcing those guidelines. Self-regulation has been an important piece. One of the big issues I have right now, with the way the digital signs have been put into effect, is that we are having trouble on the notice front. Notice is a measure of fairness in privacy. It sounds kind of kooky -- who cares about notice? I care about notice a lot.
10:49 am
I do not like the opt-out. The issue is how this technology is going to be deployed and how we give notice of ubiquitous sensors. We are not doing well on the notice front. >> There is not a lot of data on whether or not the self-regulatory guidelines are being followed in full. At least the Digital Signage Federation has issued a seal of compliance that some companies are using. Some companies have stressed that they are in full compliance with the privacy guidelines. It is extremely encouraging that companies want to get on board with the

Face Recognition Technology
CSPAN February 2, 2013 10:00am-10:50am EST

Series/Special. Balancing security and privacy. New.
